
    Delayed acceptance ABC-SMC

    Approximate Bayesian computation (ABC) is now an established technique for statistical inference used in cases where the likelihood function is computationally expensive or not available. It relies on the use of a model that is specified in the form of a simulator, and approximates the likelihood at a parameter value θ by simulating auxiliary data sets x and evaluating the distance of x from the true data y. However, ABC is not computationally feasible in cases where using the simulator for each θ is very expensive. This paper investigates this situation in cases where a cheap, but approximate, simulator is available. The approach is to employ delayed acceptance Markov chain Monte Carlo (MCMC) within an ABC sequential Monte Carlo (SMC) sampler in order to, in a first stage of the kernel, use the cheap simulator to rule out parts of the parameter space that are not worth exploring, so that the "true" simulator is only run (in the second stage of the kernel) where there is a reasonable chance of accepting proposed values of θ. We show that this approach can be used quite automatically, with few tuning parameters. Applications to stochastic differential equation models and latent doubly intractable distributions are presented.
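
    Below is a minimal sketch of the two-stage idea (cheap simulator first, expensive simulator only if the proposal survives the screen), written as a single ABC Metropolis-Hastings step with a uniform ABC kernel. All function and argument names are illustrative placeholders, and the sketch omits the correction terms of the full delayed-acceptance kernel used in the paper; it only shows where the computational saving comes from.

```python
import numpy as np

def da_abc_mh_step(theta, y, log_prior, propose, log_q_ratio,
                   cheap_sim, true_sim, dist, eps_cheap, eps, rng):
    """One two-stage ABC Metropolis-Hastings step with a uniform kernel.

    Stage 1 screens the proposal with the prior/proposal ratio and a cheap
    simulator run; the expensive simulator is only used if stage 1 passes.
    """
    theta_prop = propose(theta, rng)

    # Stage 1a: prior and proposal ratio.
    log_a1 = log_prior(theta_prop) - log_prior(theta) + log_q_ratio(theta, theta_prop)
    if np.log(rng.uniform()) >= log_a1:
        return theta                        # rejected before any simulation

    # Stage 1b: cheap simulator screen.
    x_cheap = cheap_sim(theta_prop, rng)
    if dist(x_cheap, y) > eps_cheap:
        return theta                        # not worth running the true simulator

    # Stage 2: expensive simulator, standard ABC accept/reject.
    x_true = true_sim(theta_prop, rng)
    return theta_prop if dist(x_true, y) <= eps else theta
```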

    Sequential Monte Carlo with transformations

    This paper examines methodology for performing Bayesian inference sequentially on a sequence of posteriors on spaces of different dimensions. For this, we use sequential Monte Carlo samplers, introducing the innovation of using deterministic transformations to move particles effectively between target distributions with different dimensions. This approach, combined with adaptive methods, yields an extremely flexible and general algorithm for Bayesian model comparison that is suitable for use in applications where the acceptance rate in reversible jump Markov chain Monte Carlo is low. We use this approach on model comparison for mixture models, and for inferring coalescent trees sequentially, as data arrives.
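
    The central mechanism can be sketched as a reweighting step: each particle is pushed through a deterministic map (padded with auxiliary variables when the dimensions differ), and its incremental weight combines the new target, the old target, the auxiliary density, and the Jacobian of the map. A schematic version under those assumptions, with hypothetical argument names:

```python
import numpy as np

def transform_and_reweight(particles, log_w, log_pi_curr, log_pi_next,
                           sample_aux, log_q_aux, transform, rng):
    """Move weighted particles from the current target to the next target,
    possibly of a different dimension, via a deterministic transformation.

    transform(theta, u) must return (theta_new, log_abs_det_jacobian); the
    auxiliary variables u ~ q(. | theta) pad the dimension when needed.
    """
    new_particles = []
    new_log_w = np.empty(len(particles))
    for i, (theta, lw) in enumerate(zip(particles, log_w)):
        u = sample_aux(theta, rng)
        theta_new, log_jac = transform(theta, u)
        # Incremental weight: new target over (old target x auxiliary density),
        # times the Jacobian of the deterministic map.
        incr = log_pi_next(theta_new) - log_pi_curr(theta) - log_q_aux(u, theta) + log_jac
        new_particles.append(theta_new)
        new_log_w[i] = lw + incr
    return new_particles, new_log_w
```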

    Online Bayesian inference in some time-frequency representations of non-stationary processes

    The use of Bayesian inference for time-frequency representations has, thus far, been limited to offline analysis of signals, using a smoothing spline based model of the time-frequency plane. In this paper we introduce a new framework that allows the routine use of Bayesian inference for online estimation of the time-varying spectral density of a locally stationary Gaussian process. The core of our approach is the use of a likelihood inspired by a local Whittle approximation. This choice, along with the use of a recursive algorithm for non-parametric estimation of the local spectral density, permits the use of a particle filter for estimating the time-varying spectral density online. We provide demonstrations of the algorithm through tracking chirps and the analysis of musical data.
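
    As a rough illustration of the ingredients described above, the sketch below combines a Whittle-type log-likelihood on the periodogram of each data segment with a bootstrap particle filter whose particles are log spectral density values on a frequency grid. This is a generic sketch under those assumptions, not the recursive estimator or the exact local Whittle likelihood of the paper; all names are illustrative.

```python
import numpy as np

def local_whittle_loglik(log_f, segment):
    """Whittle-type log-likelihood of one data segment, given log spectral
    density values log_f on the segment's Fourier frequencies."""
    n = len(segment)
    periodogram = np.abs(np.fft.rfft(segment)) ** 2 / n
    f = np.exp(log_f)
    return float(-np.sum(np.log(f) + periodogram / f))

def pf_spectrum(segments, n_particles, step_sd, rng):
    """Bootstrap particle filter tracking a slowly varying log spectrum.
    Each particle is a vector of log spectral densities on a frequency grid,
    evolved by a Gaussian random walk between consecutive data segments.
    All segments are assumed to have a common length."""
    n_freq = len(segments[0]) // 2 + 1
    particles = rng.normal(0.0, 1.0, size=(n_particles, n_freq))
    estimates = []
    for seg in segments:
        particles = particles + rng.normal(0.0, step_sd, size=particles.shape)
        log_w = np.array([local_whittle_loglik(p, seg) for p in particles])
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        estimates.append(w @ particles)                       # filtered posterior mean
        idx = rng.choice(n_particles, size=n_particles, p=w)  # multinomial resampling
        particles = particles[idx]
    return np.array(estimates)
```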

    Rare event ABC-SMC²

    Approximate Bayesian computation (ABC) is a well-established family of Monte Carlo methods for performing approximate Bayesian inference in the case where an "implicit" model is used for the data: when the data model can be simulated, but the likelihood cannot easily be evaluated pointwise. A fundamental property of standard ABC approaches is that the number of Monte Carlo points required to achieve a given accuracy scales exponentially with the dimension of the data. Prangle et al. (2018) propose a Markov chain Monte Carlo (MCMC) method that uses a rare event sequential Monte Carlo (SMC) approach to estimating the ABC likelihood, avoiding this exponential scaling and thus allowing ABC to be used on higher-dimensional data. This paper builds on the work of Prangle et al. (2018) by using the rare event SMC approach within an SMC algorithm, instead of within an MCMC algorithm. The new method has a similar structure to SMC² (Chopin et al., 2013), and requires less tuning than the MCMC approach. We demonstrate the new approach, compared to existing ABC-SMC methods, on a toy example and on a duplication-divergence random graph model used for modelling protein interaction networks.
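
    The key ingredient is the rare event SMC estimate of the ABC likelihood P(dist(x, y) ≤ ε | θ), obtained by pushing simulated data sets through a sequence of decreasing thresholds and multiplying the conditional survival fractions. The sketch below is a schematic, multilevel-splitting style version of such an estimator, not the exact algorithm of the paper; names and arguments are illustrative only.

```python
import numpy as np

def rare_event_abc_loglik(theta, y, simulate, perturb, dist,
                          eps_schedule, n_sims, rng):
    """Rare event SMC estimate of log P(dist(x, y) <= eps_final | theta):
    simulated data sets are pushed through a decreasing sequence of
    thresholds and the conditional survival fractions are multiplied.

    `perturb(x, theta, eps, rng)` is a user-supplied MCMC move on the data
    that keeps p(x | theta) restricted to {dist(x, y) <= eps} invariant.
    """
    xs = [simulate(theta, rng) for _ in range(n_sims)]
    log_est = 0.0
    for eps in eps_schedule:                      # thresholds decrease to eps_final
        survivors = [x for x in xs if dist(x, y) <= eps]
        if not survivors:
            return -np.inf                        # estimate of zero likelihood
        log_est += np.log(len(survivors) / n_sims)
        # Resample the survivors and diversify them with the invariant move.
        xs = [perturb(survivors[rng.integers(len(survivors))], theta, eps, rng)
              for _ in range(n_sims)]
    return log_est
```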

    Ensemble MCMC: Accelerating Pseudo-Marginal MCMC for State Space Models using the Ensemble Kalman Filter

    Particle Markov chain Monte Carlo (pMCMC) is now a popular method for performing Bayesian statistical inference on challenging state space models (SSMs) with unknown static parameters. It uses a particle filter (PF) at each iteration of an MCMC algorithm to unbiasedly estimate the likelihood for a given static parameter value. However, pMCMC can be computationally intensive when a large number of particles in the PF is required, such as when the data are highly informative, the model is misspecified and/or the time series is long. In this paper we exploit the ensemble Kalman filter (EnKF) developed in the data assimilation literature to speed up pMCMC. We replace the unbiased PF likelihood with the biased EnKF likelihood estimate within MCMC to sample over the space of the static parameter. On a wide range of non-linear SSMs, we demonstrate that our extended ensemble MCMC (eMCMC) methods can significantly reduce the computational cost whilst maintaining reasonable accuracy. We also propose several extensions of the vanilla eMCMC algorithm to further improve computational efficiency. Computer code to implement our methods on all the examples can be downloaded from https://github.com/cdrovandi/Ensemble-MCMC
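
    To make the replacement concrete, the sketch below gives an EnKF-based log-likelihood estimate (stochastic EnKF with perturbed observations) and a plain random-walk Metropolis-Hastings loop that uses it in place of a particle filter estimate. The model components (propagate, observe, obs_cov, init_ensemble) are user-supplied placeholders, not the authors' interface from the linked repository.

```python
import numpy as np
from scipy.stats import multivariate_normal

def enkf_loglik(theta, ys, propagate, observe, obs_cov, init_ensemble, rng):
    """Ensemble Kalman filter estimate of the log-likelihood of a state
    space model with (approximately) Gaussian observation error."""
    ens = init_ensemble(theta, rng)                    # shape (n_ens, state_dim)
    loglik = 0.0
    for y in ys:
        ens = propagate(ens, theta, rng)               # forecast the ensemble
        pred = observe(ens, theta)                     # predicted observations
        mu = pred.mean(axis=0)
        S = np.cov(pred, rowvar=False) + obs_cov(theta)
        loglik += multivariate_normal.logpdf(y, mean=mu, cov=S)
        # Stochastic EnKF analysis step with perturbed observations.
        K = ((ens - ens.mean(axis=0)).T @ (pred - mu) / (len(ens) - 1)) @ np.linalg.inv(S)
        y_pert = y + rng.multivariate_normal(np.zeros(len(y)), obs_cov(theta), size=len(ens))
        ens = ens + (y_pert - pred) @ K.T
    return loglik

def emcmc(theta0, ys, log_prior, propose, model, n_iter, rng):
    """Random-walk Metropolis-Hastings over the static parameters, using the
    EnKF log-likelihood in place of a particle filter estimate (a symmetric
    proposal is assumed, so no proposal correction appears in the ratio)."""
    theta, ll = theta0, enkf_loglik(theta0, ys, *model, rng)
    chain = [theta0]
    for _ in range(n_iter):
        prop = propose(theta, rng)
        ll_prop = enkf_loglik(prop, ys, *model, rng)
        if np.log(rng.uniform()) < ll_prop + log_prior(prop) - ll - log_prior(theta):
            theta, ll = prop, ll_prop
        chain.append(theta)
    return chain
```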